Critical Care
Springer Science and Business Media LLC
Preprints posted in the last 90 days, ranked by how well they match Critical Care's content profile, based on 14 papers previously published here. The average preprint has a 0.09% match score for this journal, so anything above that is already an above-average fit.
Boström, L.; Hagström, S.; Engström, J.; Larsson, A. O.; Friberg, H.; Lengquist, M.; Frigyesi, A.
Background: Sepsis is a major public health challenge, and reliable biomarkers are essential for distinguishing sepsis from other conditions. Neutrophil gelatinase-associated lipocalin (NGAL) has shown promise as a diagnostic marker due to its role in the immune response. This study evaluates plasma NGAL as a diagnostic tool at the time of ICU admission. Methods: We analysed plasma NGAL and C-reactive protein (CRP) levels in 4732 adult patients admitted to four ICUs between 2015 and 2018. All patients were retrospectively screened for Sepsis-3 criteria at ICU admission. The discriminative performance of NGAL and CRP for sepsis was assessed using receiver operating characteristic (ROC) analysis, with NGAL levels adjusted for chronic kidney disease (CKD) and age. Patients were stratified by renal function. Results: Plasma NGAL levels were significantly higher in septic patients (p<0.001). For the whole cohort, NGAL alone yielded an area under the curve (AUC) of 0.67 (confidence interval (CI) 0.66-0.69), CRP yielded an AUC of 0.72 (CI 0.71-0.73, p<0.001), and combining NGAL with CRP nominally improved discriminative performance (AUC 0.74 vs 0.72, p<0.001). Stratified analyses indicated that NGAL together with CRP significantly outperformed CRP alone in patients with no kidney injury and in those with acute kidney injury (AKI) only. In contrast, differences were not significant in patients with CKD only or with both CKD and AKI. Conclusion: In this large cohort, NGAL showed modest discrimination for sepsis, with a nominal improvement when combined with CRP. These findings do not indicate that NGAL meaningfully improves sepsis diagnosis in the ICU.
Wanka, S.-T.; Zilberszac, R.; Hermann, A.; Lenz, M.; Hengstenberg, C.; Schellongowski, P.; Staudinger, T.
Background: Early lactate is widely used to risk-stratify septic shock, yet clinically actionable cut-offs for 28-day mortality remain uncertain. Methods: In a single-centre study conducted across two intensive care units, we analysed 84 adults with septic shock identified within 24 hours of intensive care unit admission. The primary endpoint was 28-day mortality. Four lactate metrics obtained during the first 24 hours were evaluated: first (admission) lactate, last lactate, peak lactate, and lactate clearance from first to last. Associations were tested using logistic regression with and without adjustment for the Simplified Acute Physiology Score 3; discrimination was assessed by area under the receiver-operating characteristic curve (AUROC), and optimal cut-offs were defined by the Youden index. Results: Thirty-nine of 84 patients (46.4%) died by day 28. Higher absolute lactate values were independently associated with death (adjusted odds ratio (OR) per 1 mmol/L increase: first 1.47, p<0.001; last 1.41, p=0.002; peak 1.39, p<0.001), whereas lactate clearance was not (OR 0.65, p=0.202). Discrimination was moderate to good for peak (AUROC 0.817), first (0.791), and last (0.757) lactate, and poor for clearance (0.577). Youden-derived thresholds provided pragmatic trade-offs: first 3.55 mmol/L (sensitivity 0.821, specificity 0.689), last 3.15 mmol/L (0.567, 0.864), and peak 3.55 mmol/L (0.973, 0.556). Kaplan-Meier curves using these cut-offs showed early and sustained separation. Conclusions: In adults with septic shock, simple early lactate thresholds around 3.3-3.6 mmol/L (first/peak) and approximately 3.15 mmol/L (last) identify 28-day mortality risk and outperform lactate clearance.
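The cut-off selection described above (maximising the Youden index J = sensitivity + specificity - 1 over candidate thresholds, with discrimination summarised by the AUROC) can be sketched in a few lines of NumPy. This is a generic illustration on synthetic data, not the study's actual code; the rank-based AUROC uses the Mann-Whitney formulation.

```python
import numpy as np

def youden_cutoff(values, died):
    """For a 'higher value = higher risk' marker, scan every observed
    value as a candidate threshold and return
    (best_cutoff, sensitivity, specificity, auroc)."""
    values, died = np.asarray(values, float), np.asarray(died, bool)
    cutoffs = np.unique(values)
    best = (None, 0.0, 0.0, -1.0)          # cutoff, sens, spec, J
    for c in cutoffs:
        pred = values >= c
        sens = (pred & died).sum() / died.sum()
        spec = (~pred & ~died).sum() / (~died).sum()
        j = sens + spec - 1                 # Youden index
        if j > best[3]:
            best = (c, sens, spec, j)
    # AUROC via ranks (Mann-Whitney U), averaging ranks over ties
    order = values.argsort()
    ranks = np.empty(len(values))
    ranks[order] = np.arange(1, len(values) + 1)
    for v in cutoffs:
        tie = values == v
        ranks[tie] = ranks[tie].mean()
    n1, n0 = died.sum(), (~died).sum()
    auroc = (ranks[died].sum() - n1 * (n1 + 1) / 2) / (n1 * n0)
    return best[0], best[1], best[2], auroc
```

On a perfectly separable toy sample the scan recovers the separating threshold with sensitivity, specificity, and AUROC all equal to 1.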
Farina Gonzalez, T. F.; Martinez Sagasti, F.; Hernando, M. E.; Oropesa, I.; Nunez-Reiz, A.; Gonzalez-Gallego, M. A.; Latorre, J.; Quintana-Diaz, M.
Purpose: To describe HRV metrics in COVID-19 patients at ICU admission and their relationship with mortality and invasive mechanical ventilation (IMV). Heart rate variability (HRV) in the sitting position has not been explored in critically ill patients. Methods: We conducted a prospective single-centre observational study. Adult patients admitted to the ICU with respiratory failure due to RT-PCR-confirmed SARS-CoV-2, but not under IMV, were included. The electrocardiogram was recorded for at least 15 minutes at 500 Hz during a stable sitting condition. The power spectrum was obtained using wavelets. Very low frequency (VLF), low frequency (LF) and high frequency (HF) power bands were calculated and then normalized to total power (VLFn%, LFn% and HFn%). We also analyzed non-linear HRV dynamics. Results: 27 patients were included. LFn% was lower in non-survivors (4.5 vs 8%, p=0.015) and related to 28-day mortality (OR 0.61; 95% CI: 0.32 to 0.9, p=0.05). In a robust generalized Gamma linear model, we found that detrended fluctuation analysis alpha 2 (DFA2) (RR 0.092; bootstrapped 95% CI: 0.031-0.361, p=0.003), admission APACHE II score (RR 1.084; bootstrapped 95% CI: 1.046 to 1.123, p=0.002) and the SAF index (RR 0.985; bootstrapped 95% CI: 0.982-0.990, p<0.001) were associated with longer ICU length of stay (LOS). Conclusions: Diminished normalized LF power was associated with 28-day mortality in univariate analysis among critically ill COVID-19 patients on spontaneous ventilation, potentially reflecting an altered autonomic response in early severe COVID-19. Normalized and absolute VLF power should be considered when analyzing HRV in ICU patients. DFA2 was the HRV variable with the strongest association with ICU LOS. These exploratory results could help design new tools for early prognostication in COVID-19 patients.
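The band normalization step (VLFn%, LFn%, HFn%) is simple once a power spectral density is available. A minimal sketch, assuming a uniform frequency grid and the conventional HRV band limits; the paper used a wavelet spectrum, but any PSD estimate can be binned the same way:

```python
import numpy as np

# Conventional HRV band limits in Hz (assumption; the paper's exact
# limits are not given in the abstract)
BANDS = {"VLF": (0.003, 0.04), "LF": (0.04, 0.15), "HF": (0.15, 0.40)}

def normalized_band_powers(freqs, psd):
    """Sum a PSD over the VLF/LF/HF bands (rectangle rule, uniform grid)
    and express each band as a percentage of the three bands' combined
    power."""
    freqs, psd = np.asarray(freqs, float), np.asarray(psd, float)
    df = freqs[1] - freqs[0]                # uniform spacing assumed
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = psd[mask].sum() * df
    total = sum(powers.values())
    return {f"{name}n%": 100 * p / total for name, p in powers.items()}
```

With a flat (white) spectrum, each normalized band power is simply proportional to the band's width, which is a quick sanity check for the binning.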
Chen, D.; Jiang, Q.; Shi, Z.; Yang, Y.; Liu, L.; Lei, X.; Zhang, C.
Purpose: Sepsis-associated immunothrombosis significantly contributes to high mortality, yet the role of N-glycosylation in this process remains poorly understood. This study aimed to comprehensively profile the plasma N-glycosylation landscape in sepsis and elucidate how its specific reprogramming in the complement and coagulation cascades influences immunothrombotic balance and patient outcomes. Methods: We performed in-depth 4D-DIA proteomic and N-glycomic analyses on plasma from 43 sepsis patients and 9 healthy controls. Differential expression, weighted gene co-expression network analysis (WGCNA), and protein-glycosylation correlation analyses were used to characterize molecular features. Clinical relevance was assessed via correlation and survival analyses. Results: Extensive N-glycosylation reprogramming was observed in sepsis plasma, with marked enrichment in complement and coagulation pathways (KEGG p=7.76x10^-21). Pro-coagulant proteins (e.g., vWF, fibrinogen) showed increased abundance together with enhanced site-specific glycosylation, potentially amplifying their activity. In contrast, key anticoagulant proteins (e.g., SERPINC1) displayed unchanged glycosylation at critical sites despite abundance changes, which may impair function. Survival analysis revealed distinct prognostic values of glycoproteins and specific glycosylation sites. For instance, high vWF protein levels predicted mortality (HR=2.83), whereas elevated glycosylation at vWF N211 was associated with improved survival (HR=0.135), suggesting a negative regulatory role. These glycosylation markers correlated closely with disease severity and prognosis, representing potential early-warning biomarkers independent of current clinical coagulation indicators.
Conclusion: Our study demonstrates widespread reprogramming of the plasma proteome and N-glycome in sepsis. We propose that decoupling of protein function from abundance through N-glycosylation in the complement-coagulation network contributes to immunothrombotic imbalance. Specific N-glycosylation sites may serve as novel prognostic biomarkers, offering new perspectives for early risk stratification and glycosylation-targeted therapies in sepsis. Key points:
- Sepsis plasma exhibits specific N-glycosylation reprogramming overwhelmingly focused on the complement and coagulation cascade.
- A dominant "glycosylation-dominated co-upregulation" mode in procoagulant factors, coupled with a "silent" glycosylation state in key anticoagulants, drives prothrombotic imbalance.
- Site-specific N-glycosylation levels provide prognostic information distinct from, and often superior to, their carrier protein abundance, offering novel early-risk biomarkers.
Krishnan, P.; Sikora, A.; Murray, B.; Ali, A.; Podgoreanu, M.; Upadhyaya, P.; Gent, A.; CHOUDHARY, T.; Holder, A. L.; Esper, A.; Kamaleswaran, R.
Rationale: Autonomic dysfunction is a hallmark of sepsis pathophysiology, yet its quantification remains challenging. Multiscale entropy (MSE) derived from heart rate variability (HRV) offers a dynamic measure of physiological complexity and may serve as a biomarker of early deterioration associated with subsequent organ failure, vasopressor escalation, or mortality. Objective: To determine whether MSE computed across multiple temporal scales during the first 24 hours of intensive care unit (ICU) admission is associated with short-term mortality and longer-term organ dysfunction in patients with sepsis, and whether these relationships vary with vasopressor exposure. Unlike prior studies that focused on short-term HRV metrics, we applied MSE across multiple temporal scales and incorporated these features into machine learning models to evaluate their prognostic utility in septic shock. Methods: This retrospective cohort study included adult ICU sepsis patients at Emory University Hospital from January 2016 to December 2019. Of 2,076 eligible patients, 958 were propensity matched into two cohorts: fluids-only and fluids-plus-vasopressor, with norepinephrine as the primary vasopressor. High-resolution electrocardiogram (ECG) waveforms were analyzed to compute MSE across 20 temporal scales. Machine learning models using (1) MSE features alone and (2) MSE combined with demographic and vital sign data (MSE-DV) were compared against models based on traditional HRV measures and against severity-of-illness scores for predicting outcomes. Model performance was assessed using the area under the receiver operating characteristic curve (AUROC), with a primary outcome of mortality at day 7 and a secondary outcome of persistent organ dysfunction at day 28. Results: In the fluids-plus-vasopressor cohort, MSE-based models demonstrated superior predictive performance for 7-day mortality (AUROC 0.84) compared to severity-of-illness scores (AUROC 0.64).
MSE-DV models also predicted 28-day organ dysfunction, including renal (AUROC 0.75), neurological (AUROC 0.79), and respiratory (AUROC 0.71) dysfunction. Patients receiving second-line and third-line vasopressors and corticosteroids exhibited progressively lower MSE values, particularly at mid-range and long-range scales. Conclusion: MSE features in the first 24 hours of ICU stay predict mortality and organ dysfunction with higher discrimination than traditional severity-of-illness scores. Future work should validate these findings, assess longitudinal MSE trends, and examine race-specific autonomic patterns to refine predictive models.
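Multiscale entropy, as commonly defined, coarse-grains the series at each scale (non-overlapping window means) and computes sample entropy of each coarse-grained series. A compact NumPy sketch of that standard construction, not the study's implementation; parameter choices (m=2, r=0.2·SD) are common defaults, assumed here:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A/B), tolerance r given as a fraction of the
    series' standard deviation."""
    x = np.asarray(x, float)
    tol = r * x.std()
    def count_matches(mm):
        # all length-mm templates; count pairs within Chebyshev distance tol
        t = np.array([x[i:i + mm] for i in range(len(x) - mm + 1)])
        c = 0
        for i in range(len(t) - 1):
            d = np.max(np.abs(t[i + 1:] - t[i]), axis=1)
            c += int(np.sum(d <= tol))
        return c
    b, a = count_matches(m), count_matches(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

def multiscale_entropy(x, scales=range(1, 21), m=2, r=0.2):
    """Coarse-grain at each scale (non-overlapping means), then take
    sample entropy of each coarse-grained series."""
    x = np.asarray(x, float)
    out = []
    for s in scales:
        n = len(x) // s
        coarse = x[:n * s].reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return out
```

A quick check of the definition: a strictly periodic signal is highly predictable and should score far lower than white noise.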
Yawata, S.; Uchino, S.; Yamashima, S.; Nishiyama, S.; Ono, S.; Sasabuchi, Y.; Katayama, S.
Background: The role of arterial blood gas (ABG) testing in the intensive care unit (ICU) remains debated within the "less is more" paradigm. While unnecessary testing may pose risks without benefit, timely ABGs provide critical information in unstable patients. Institutional variation in early ABG utilization and its association with outcomes remains unclear. Methods: We conducted a multicenter retrospective cohort study using the Japanese Intensive Care PAtient Database (JIPAD) between April 2015 and March 2023. Adult ICU patients with a stay ≥24 h and arterial line placement were included. The standardized number of ABGs (SNABGs) within the first 24 h was calculated as the ratio of observed to expected values, where expectations were derived from a multivariable model adjusting for patient covariates. ICUs were categorized into tertiles according to SNABG utilization. The primary outcome was in-hospital mortality, analyzed using multilevel logistic regression with ICU-level random intercepts. Restricted cubic splines were used to explore non-linear associations. Results: Among 117,546 patients from 87 ICUs, the mean number of ABGs varied widely. After standardization, SNABGs ranged from 0.73-0.90 in the low tertile to 1.09-1.15 in the high tertile. In the multilevel model, SNABG was not significantly associated with in-hospital mortality (adjusted OR 0.942 [95% CI 0.807-1.100] for tertile 2; 0.874 [95% CI 0.751-1.017] for tertile 3). Flexible modeling suggested a non-linear trend toward better outcomes with higher utilization, but confidence intervals included unity. Conclusion: Early ABG utilization varied across ICUs, yet was not significantly associated with mortality. Sensitivity analysis suggested a non-linear relationship, with a tendency toward better outcomes at higher utilization. These findings warrant further investigation to clarify the role of early ABG utilization in critical care.
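The standardization described above is an observed/expected ratio followed by a tertile split at the ICU level. A minimal sketch of those two steps (the expected counts would come from the study's multivariable model, which is not reproduced here; the tertile boundaries below use simple quantiles as an assumption):

```python
import numpy as np

def standardized_count(observed, expected):
    """Standardized number of ABGs (SNABG): observed / expected, where
    the expected count comes from a patient-level regression model."""
    observed = np.asarray(observed, float)
    expected = np.asarray(expected, float)
    return observed / expected

def icu_tertiles(snabg_by_icu):
    """Label each ICU's mean SNABG as low/middle/high by tertile."""
    q1, q2 = np.quantile(snabg_by_icu, [1 / 3, 2 / 3])
    return ["low" if v <= q1 else "middle" if v <= q2 else "high"
            for v in snabg_by_icu]
```

An SNABG of 1.0 means an ICU draws exactly as many early ABGs as its case mix predicts; values above 1 indicate heavier-than-expected utilization.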
Zhang, S.; Joosten, S.; Boers, L. S.; van den Heuvel, H.; Dekker, T.; Davison, R.; Garcia Vallejo, J. J.; van der Poll, T.; Duitman, J.; Bos, L.
In this study, we provide a comprehensive characterization of the alveolar immune landscape in patients suffering from severe acute respiratory failure, predominantly caused by pneumonia or acute respiratory distress syndrome, conditions defined by intense pulmonary inflammation and immune dysregulation. Despite diverse underlying causes, the overall composition of alveolar immune cells was largely consistent, with neutrophils and macrophages comprising the majority of cells. However, the maturation and activation states of immune cell subsets varied significantly, not only between patients with and without pneumonia, but also among pneumonia cases stratified by pathogen type. We also observed dynamic shifts in immune cell subsets over the disease course and found that an increased proportion of CD123bright immature neutrophils and a reduction in alveolar resident macrophages were associated with increased 28-day mortality. Integration with alveolar cytokine profiles revealed strong correlations between immune cell populations and the local cytokine milieu. These findings highlight the importance of assessing immune cell function, not merely abundance, through broad and longitudinal investigation to better understand the pathophysiology of acute respiratory failure and to guide precision immunomodulatory therapy.
Chiriac, U.; Muenchow, M.; Roehr, A. C.; Frey, O. R.; Frey, A. T.; Frisch, D.; Gaasch, M.; Weigand, M. A.; Wicha, S.; Brinkmann, A.
Objectives: To provide real-world evidence on piperacillin exposure and outcomes in critically ill patients following the implementation of pharmacokinetic (PK)/pharmacodynamic (PD)-guided dosing in routine care. Methods: This retrospective observational study included critically ill adults who received continuous piperacillin/tazobactam infusion between 2011 and 2019. Empiric doses were individualized using dosing software based on renal function and subsequently adjusted according to therapeutic drug monitoring (TDM) results. Drug exposure was defined as subtherapeutic (<32 mg/L), therapeutic (32-64 mg/L), moderately high (64-96 mg/L), or supratherapeutic (>96 mg/L). Results: A total of 1538 critically ill patients with severe infections and sepsis of varying severity were included, and 3,090 piperacillin serum concentrations were analysed. The median daily piperacillin dose was 8,000 mg, the median steady-state concentration 55 mg/L, and the median clearance 6.25 L/h. At the first measurement, individualized empiric dosing resulted in 45.7% of patients being within the therapeutic range; after TDM-guided adjustment, target attainment increased to 62.4%. Subtherapeutic and supratherapeutic concentrations were uncommon among all TDM samples collected during individualized dosing (<32 mg/L: 12.8%; <16 mg/L: 0.8%; >96 mg/L: 11%). ICU mortality was 21.1% in patients within the therapeutic range, 30.3% in those with moderately high concentrations, and 44.1% in those with supratherapeutic concentrations (p=0.05). Women were 1.8 times more likely to present supratherapeutic concentrations. Conclusions: A multimodal approach combining individualized empiric dosing, TDM, and continuous infusion ensured target attainment while reducing drug consumption. These findings support the integration of individualized, PK/PD-guided dosing into routine care for critically ill patients and highlight the need for further studies addressing sex-related pharmacokinetic variability.
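The exposure bands above map directly to a small classification helper. The treatment of the exact boundary values (32, 64, 96 mg/L) is not specified in the abstract, so the inclusive/exclusive choices below are assumptions:

```python
def exposure_band(conc_mg_per_l):
    """Classify a piperacillin steady-state concentration (mg/L) into
    the study's exposure bands. Boundary handling (which band owns
    exactly 32/64/96 mg/L) is an assumption, not stated in the source."""
    c = conc_mg_per_l
    if c < 32:
        return "subtherapeutic"
    if c <= 64:
        return "therapeutic"
    if c <= 96:
        return "moderately high"
    return "supratherapeutic"
```

For example, the cohort's median steady-state concentration of 55 mg/L falls in the therapeutic band.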
Fazzini, B.; STEPHENS, T.; Pickles, F.; Mathieson, G.; Pattison, R.; Kelly, E.; Nazeer, S.; Heunks, L.; Doorduin, J.; puthucheary, z.
Background: Patients with acute respiratory failure requiring non-invasive respiratory support are at high risk of deterioration. Different advanced monitoring instruments are available that can provide objective measurements; however, there is currently no evidence synthesis on these instruments. The aim of this project was to systematically synthesize data identifying the advanced monitoring instruments used and their effectiveness. Methods: We conducted a systematic search of MEDLINE (via PubMed), EMBASE, Web of Science, the Cochrane Library and CINAHL (PROSPERO registration: CRD42024597047). We included studies of acute respiratory failure patients requiring non-invasive respiratory support in which the investigators used an advanced monitoring instrument during the hospital stay. We followed the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) guidelines. Results: Seventy-eight studies including 3709 patients fulfilled the selection criteria. The monitoring instruments used were respiratory muscle ultrasound in 32% (n=25/78), oesophageal manometry in 32% (n=25/78), electrical impedance tomography in 24% (n=19/78), electrical activity of the diaphragm (EAdi) catheter in 18% (n=14/78) and surface EMG of the parasternal muscle in 6% (n=5/78). Thirteen studies (17%) used a multi-modal monitoring approach. Patients failing non-invasive respiratory support showed higher oesophageal pressure swings (ΔPes) [MD 12.60 (95% CI 4.03, 21.16), p=0.004], lung ultrasound score (LUS) [MD 3.93 (95% CI 1.29, 6.57), p=0.003] and parasternal intercostal thickening fraction (PIC-TF%) [MD 12.58 (95% CI 8.02, 17.13), p<0.001], but lower diaphragmatic thickening fraction (DTF%) [MD -17.20 (95% CI -20.97, -13.42); p<0.001] and lower diaphragmatic excursion (DE) [MD -0.95 (95% CI -1.08, -0.82); p<0.001]. Conclusion: Advanced monitoring instruments may detect patients failing non-invasive respiratory support.
Take-home message: Advanced bedside monitoring during non-invasive respiratory support can provide unique physiological insights into respiratory muscle workload and treatment response in acute respiratory failure. Our meta-analysis shows that five measurements, i) oesophageal pressure changes (ΔPes), ii) lung ultrasound score (LUS), iii) parasternal intercostal thickening fraction (PIC-TF%), iv) diaphragmatic thickening fraction (DTF%) and v) diaphragmatic excursion (DE), may discriminate responders from non-responders to non-invasive respiratory support.
Kunche, N.
Background: Severity scoring systems such as SOFA, NEWS2, and qSOFA effectively identify deteriorating ICU patients by aggregating physiological parameters into composite indices that trigger clinical alerts. However, these systems evaluate patient state at discrete time points and do not model the temporal dynamics of organ deterioration or the pharmacokinetic constraints that govern whether a given intervention can achieve therapeutic effect before an organ trajectory crosses an irreversible threshold. This limitation is consequential because interventions across critical care span pharmacokinetic onset times from seconds (vasopressors) to hours (metabolic corrections, blood products, enzymatic cofactors), yet no existing framework quantifies timing adequacy as a function of these intervention-specific pharmacokinetic properties. Methods: We developed the Multi-Organ Intervention State Space (MOISS), a collision geometry framework that classifies intervention timing adequacy by computing the temporal relationship between the predicted time for a biomarker trajectory to reach a critical threshold and the time required for the administered intervention to achieve peak therapeutic effect. Biomarker trajectories were estimated using the Kunche Adaptive Estimator (KAE), a reliability-adaptive Kalman filter that provides continuous position and velocity estimates from intermittent laboratory measurements. MOISS assigns each intervention event to one of six ordinal categories: PROPHYLACTIC, ON_TIME, PARTIAL, MARGINAL, FUTILE, or TOO_LATE. We applied this framework to 301,470 ICU patients across three databases (eICU-CRD, MIMIC-IV, MIMIC-III), evaluating 65 distinct intervention-organ pairs spanning 10 organ systems: Cardiovascular, Metabolic, Respiratory, Renal, Hematologic, Hepatic, Gastrointestinal, Infection, Endocrine, and Neurological. 
Results: Timing-mortality associations were identified across all 10 organ systems, with 87 intervention-database combinations achieving statistical significance (p<0.05). The highest timing sensitivity was observed in metabolic corrections: thiamine supplementation for metabolic acidosis (OR 5.76; 95% CI 4.86-6.83 in MIMIC-IV), sodium bicarbonate (OR 4.99; 95% CI 4.27-5.82 in MIMIC-IV). Respiratory interventions showed comparable magnitude: mechanical ventilation initiation (OR 5.03; 95% CI 4.42-5.73 in MIMIC-IV). Hematologic interventions demonstrated strong timing dependency: platelet transfusion (OR 4.25; 95% CI 3.68-4.90), fresh frozen plasma (OR 3.41; 95% CI 2.94-3.95). Cardiovascular agents ranged from OR 1.40 for norepinephrine (consistent with its rapid 1-2 minute onset providing a forgiving therapeutic window) to OR 2.23 for milrinone. Infection-directed therapies, hepatic support, renal replacement, endocrine correction, gastrointestinal interventions, and neurological agents all contained timing-sensitive members. Cross-database consistency was demonstrated for 29 of 52 testable interventions (55.8%), with 6 interventions achieving significance across all three databases. Conclusions: Intervention timing sensitivity is pervasive across the entire spectrum of critical care therapeutics, spanning all 10 organ systems and all pharmacokinetic classes evaluated. MOISS provides a systematic framework for quantifying this timing adequacy that complements existing severity scoring by adding the pharmacokinetic timing dimension: where SOFA, NEWS2, and qSOFA identify that a patient is deteriorating, MOISS computes whether the specific planned intervention can still achieve its intended effect given the current organ trajectory and pharmacokinetic constraints. The universality of timing sensitivity across organ systems argues for multi-organ trajectory monitoring as the foundation for next-generation clinical decision support.
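The core MOISS comparison is between two times: the predicted time for the biomarker trajectory to cross its critical threshold and the intervention's time to peak therapeutic effect. The abstract names the six ordinal categories but not their boundary rules, so the sketch below is a hypothetical illustration of that comparison with invented margin cut-offs:

```python
def timing_category(t_threshold_h, t_peak_effect_h):
    """Illustrative MOISS-style classification. t_threshold_h is the
    estimated hours until the biomarker crosses its critical threshold
    (None if the trajectory never crosses; negative if already crossed);
    t_peak_effect_h is the intervention's pharmacokinetic time to peak
    effect. The 2-hour margins are hypothetical, chosen only to make
    the six categories concrete."""
    if t_threshold_h is None:           # trajectory never reaches threshold
        return "PROPHYLACTIC"
    margin = t_threshold_h - t_peak_effect_h
    if margin > 2:
        return "ON_TIME"                # peak effect well before crossing
    if margin > 0:
        return "PARTIAL"                # peak effect just before crossing
    if margin > -2:
        return "MARGINAL"               # peak effect shortly after crossing
    if t_threshold_h > 0:
        return "FUTILE"                 # crossing precedes effect by far
    return "TOO_LATE"                   # threshold already crossed
```

This captures why a fast-onset agent like norepinephrine (1-2 minute onset) has a forgiving window: its small t_peak_effect_h keeps the margin positive for almost any plausible trajectory.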
Batista, N. O. W.; Fiori, H. H.; Knop, N. C. F.
Introduction: Hyperferritinemia is a prognostic marker in critical illness, but its role in postoperative outcomes of pediatric congenital heart defects remains poorly defined, especially in resource-limited settings. This study evaluated early serum ferritin as a predictor of outcomes after congenital heart surgery and its association with the PIM 3 score. Methods: A single-center, prospective cohort study was conducted from April 2023 to October 2024 at a tertiary referral center in southeastern Brazil. Patients aged 29 days to 18 years, of both sexes, admitted to the PICU after congenital heart surgery were included and categorized as cyanotic or acyanotic. Statistical significance was defined as two-sided p < 0.05. Results: A total of 105 patients were included. Median ferritin was higher in patients with PICU stays < 7 days (183 ng/mL; p = 0.004) and was significantly associated with a PIM 3 score ≥ 5% (642 ng/mL; p < 0.006). Cyanotic patients had longer PICU stays (11.0 vs. 7.2 days; p = 0.02), longer use of vasoactive drugs (3.8 vs. 2.6 days; p = 0.01), and accounted for all deaths (p < 0.001). Hemoglobin and hematocrit were also significantly higher in cyanotic patients (14 vs. 13 g/dL and 40% vs. 37%; p < 0.001). Conclusions: Serum ferritin may serve as a marker of secondary outcomes and aid early risk stratification in congenital heart defect patients in the PICU.
Shafreen, M.; Chakraborty, M.; Patil, L.; Navamani, S.; Shema, E.; Pujari, D.; More, S.; Satish, D.
Background: Heart failure (HF) is a frequent and severe complication among patients with chronic kidney disease (CKD), particularly in advanced stages and end-stage renal disease (ESRD). This study focuses on understanding the molecular interplay between CKD and HF beyond the context of maintenance hemodialysis (MHD). Given that peripheral blood mononuclear cells (PBMCs) reflect systemic inflammatory and transcriptional alterations, we analyzed PBMC transcriptomes to uncover potential biomarkers and mechanistic links connecting CKD and HF. Methods: Publicly available RNA-Seq data comprising PBMCs from 15 CKD patients with HF (SRX23265333) and 14 healthy controls (SRX19031772) were analyzed. Quality control was performed using FastQC and Fastp, followed by alignment to the human reference genome with HISAT2. Gene counts were normalized, and differential expression was determined using DESeq2. Functional enrichment analyses (Gene Ontology and KEGG) identified key biological pathways. Protein-protein interaction (PPI) networks were constructed using STRING, and hub genes were validated through disease-gene associations in the Comparative Toxicogenomics Database (CTD). Results: Differential expression analysis revealed several genes significantly dysregulated in CKD patients with HF compared to controls. Enrichment results highlighted processes associated with extracellular matrix remodeling, immune activation, and cardiac-renal fibrosis. PPI analysis identified four major hub genes (CCL2, ALB, EGFR, and COL1A2) as central nodes within the network. These genes are functionally linked to inflammatory signaling, vascular remodeling, and fibrotic progression, consistent with the pathophysiological mechanisms of HF and CKD. CTD validation further confirmed their association with cardiorenal dysfunction. Discussion: This integrative transcriptomic study identifies CCL2, ALB, EGFR, and COL1A2 as key PBMC-expressed hub genes linking CKD and HF.
The findings enhance understanding of the molecular basis of cardiorenal syndrome and propose candidate biomarkers and therapeutic targets for future translational research.
Pinheiro Da Silva, F.
Antimicrobial peptides (AMPs) are essential components of the innate immune system, exhibiting diverse mechanisms of action. This study investigates the roles of cathelicidin (LL-37), alpha-defensins, and the S100 proteins S100A8 and S100A9 in systemic inflammation associated with sepsis, severe COVID-19, and acute pancreatitis using whole-blood bulk RNA-sequencing data. Gene co-expression network analysis revealed that during septic shock and severe COVID-19, cathelicidin and alpha-defensins act synergistically in innate immune responses, while S100A8 and S100A9 function through distinct pathways related to mitochondrial metabolism and ubiquitin ligase binding. In contrast, the acute pancreatitis network displayed a different pattern, with CAMP co-expressed alongside S100A8 and S100A9, whereas alpha-defensins were downregulated and associated with inhibited mucosal immune responses. These findings suggest that antimicrobial peptides contribute variably to systemic inflammation depending on the underlying insult, underscoring their complex, context-dependent roles in critical illness.
Joseph, A.; Ricard, J.-D.; de Margerie-Mellon, C.; Walter, T.
Objective: To assess whether focal and non-focal COVID-19 ARDS exhibit different respiratory mechanics and arterial blood gas (ABG) trajectories during the first extended prone positioning (PP) session. Design: Post-hoc analysis of a previously published retrospective monocentric cohort study. Setting: A university-affiliated intensive care unit in Paris (France) between March 2020 and April 2021. Patients: Seventy-four adult patients with moderate-to-severe COVID-19 ARDS who underwent extended prone positioning (PP) and had chest imaging (CT or X-ray) performed within five days before the first PP session. Interventions: None. Measurements: Changes in compliance, driving pressure, PaO2/FiO2 ratio, and ventilatory ratio between pre-PP and end-of-PP; diagnostic agreement between reviewers on chest X-rays and CT scans, assessed by crude agreement and Cohen's kappa coefficient. Main results: Diffuse ARDS predominated, identified in 91% by the intensivist and 86% by the radiologist. Crude diagnostic agreement was 89% (95% CI 79.8-95.2), while interrater reliability was only fair (κ = 0.44, 95% CI 0.08-0.81). Discrepancies mainly involved focal classifications on chest X-rays, whereas agreement was perfect when CT scans were available. The small number of focal cases precluded comparative analysis of PP trajectories. Conclusions: PP indication or duration should probably not be based on ARDS phenotypes, and when differentiating focal from diffuse COVID-19 ARDS is necessary, chest CT scans should be preferred to ensure accurate and reproducible phenotyping.
de Prost, N.; Bay, P.; Le Goff, M.; Preau, S.; Guigon, A.; Beloncle, F. M.; Lefeuvre, C.; Dartevel, A. i.; Larrat, S.; Coudroy, R.; Deroche, L.; Darreau, C.; Thomin, J.; Aubron, C.; Tran, A.; Uhel, F.; Le Hingrat, Q.; Tamion, F.; Moisan, A.; Guillon, A.; Handala, L.; Souweine, B.; Henquell, C.; Klouche, K.; Tuaillon, E.; Damoisel, C.; Roque Afonso, A. M.; Gault, E.; Cappy, P.; Soulier, A.; Pawlotsky, J. M.; Lemoine, F.; Rameix Welti, M. A.; Audureau, E.; Fourati, S.; SEVARVIR consortium,
ImportanceRecent reports have highlighted an intense influenza activity related to the circulation of the influenza A(H3N2) subclade k variant. There is no data available on the impact of the emergence of H3N2 subclade k on the severity of the 2025-2026 epidemic or on the clinical phenotype of patients requiring admission to the intensive care unit (ICU). ObjectiveTo compare the clinical presentation, hospital mortality and virological characteristics of patients with laboratory-confirmed influenza infection included in French intensive care units during the 2025-2026 epidemic season with those of patients admitted during the 2024-2025 season. We also aimed at measuring the impact of the A(H3N2) subtype on hospital mortality during the 2025-2026 season. DesignProspective, multicenter, observational SEVARVIR cohort study including patients admitted during the 2024-2025 and 2025-2025 influenza seasons. SettingForty-two French ICUs ParticipantsAdult patients with laboratory-confirmed influenza infection Interventionsnone Main Outcomes and MeasuresThe primary outcome measure was in-hospital mortality. ResultsPatients admitted in intensive care units for influenza in 2024-2025 (n=360) and 2025-2026 (n=325) were included in the French nationwide prospective multicentre SEVARVIR study. There was no significant difference in day-28 mortality between the seasons (12.7%, n=45/355 vs 16.5% n=28/170; p=0.28). In the 2025-26 season, 49% had the A(H1N1) subtype and 51% the A(H3N2) subtype (k subclade: 77%). The univariable Cox analysis revealed that patients infected with A(H3N2) viruses were at greater risk of death over time. Multivariable Cox analysis revealed that during the 2025-2026 season, age (adjusted hazard ratio, aHR=1.05 [1.00;1.11]; p=0.046) and the clinical frailty scale (aHR=1.82 [1.26;2.72]; p=0.001) were associated with an increased risk of death. The A(H3N2) subtype was not associated with an increased risk of death (aHR=1.13 [0.32;4.51]; p=0.85). 
Phylogenetic analyses of our ICU cohort together with 300 contextual sequences from community-acquired influenza cases collected during the same period showed no clustering by severity. Conclusions and Relevance: This French national prospective observational study found that the emergence of influenza A(H3N2) subclade k was associated with an increased risk of death in univariable but not in multivariable analysis adjusting for host-related factors. Trial Registration: NCT051625. Key Points. Question: What impact did the 2025-2026 influenza epidemic and the A(H3N2) variant have on the mortality of patients admitted to intensive care units? Findings: In this prospective, nationwide cohort study of 685 patients admitted to intensive care units with severe influenza during the 2024-2025 or 2025-2026 seasons, no difference in hospital mortality was observed between the two seasons. Patients infected with the A(H3N2) virus, 77% of which corresponded to subclade k, were at higher risk of death in univariable but not in multivariable analysis after adjusting for age and clinical frailty scale. Meaning: Patients in intensive care units with severe A(H3N2) infection during the 2025-2026 season were not at higher risk of death after adjusting for confounding variables.
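An aside on reading the adjusted hazard ratios above: in a Cox model, a per-unit aHR compounds multiplicatively over a covariate range, so the reported age effect (aHR 1.05 per year) implies roughly a 1.6-fold hazard across a 10-year age difference. A minimal sketch (the function name is ours, purely illustrative):

```python
def scaled_hazard_ratio(hr_per_unit: float, units: float) -> float:
    """Multiplicative effect on the hazard for a `units`-sized change in a
    Cox-model covariate: the hazard ratio scales as hr_per_unit ** units."""
    return hr_per_unit ** units

# Reported age effect: aHR 1.05 per year -> compounded hazard over a 10-year gap
print(round(scaled_hazard_ratio(1.05, 10), 2))  # about 1.63
```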
Gehring, M.
Background: Pulse oximeters are typically validated on cohorts of 200-500 subjects under controlled conditions. Whether these cohorts capture the demographic heterogeneity of national clinical practice, and whether measurement error is associated with patient outcomes, has not been established at scale. Methods: We analyzed paired SpO2/SaO2 readings from three independent sources spanning 209 U.S. hospitals: MIMIC-IV (1 hospital; 12,934 ICU stays), eICU-CRD (208 hospitals; 55,178 stays), and the Open Oximetry Repository (PhysioNet; 52.4 million readings with continuous melanin and perfusion indices). Bias was defined as SpO2 - SaO2. Hidden hypoxemia (SpO2 ≥ 94% with SaO2 < 88%) was assessed per ICU stay. Mortality was compared between hidden-hypoxemia-positive and -negative stays with multivariable logistic regression adjusting for age, sex, race, and four laboratory severity markers (cluster-robust SEs by hospital). Sensitivity analyses included landmark restriction (first 48 hours), lactate stratification, alternate thresholds, and patient-level aggregation. PPG signal quality was assessed in 125 ICU patients with demographic-linked waveform data. Results: Bias was minimal at normal perfusion but amplified under low perfusion in high-melanin patients, consistent with known optics: at very low perfusion × high melanin × severe hypoxia, mean bias reached +12.8% (n = 458,571), with 47% of readings constituting hidden severe hypoxemia. National bias in African American patients was +2.76% (n = 529,541; 208 hospitals), 62% higher than academic estimates. Across 55,178 eICU stays, hidden hypoxemia was associated with approximately doubled mortality after adjustment for age, sex, race, and illness severity (adjusted OR 1.86, 95% CI 1.69-2.04, p < 0.001), consistent across all racial groups.
Hidden hypoxemia was not a pre-terminal phenomenon: 63% of events occurred >48 hours before death (median first event: 15.3 hours; mean time to death: 151 hours), and the association persisted in landmark analysis (first 48 hours only), in patients with normal lactate (adjusted OR 1.87, 95% CI 1.61-2.16), and when both restrictions were applied simultaneously (16.5% vs. 11.1%). Waveform analysis (n = 125) showed no fixed racial difference in baseline PPG AC/DC ratio (Black: 0.299, White: 0.273), suggesting the signal deficit is conditional on perfusion state. Full extraction (n = 1,545) is in progress. Conclusions: In this multicenter retrospective analysis, national pulse oximetry variance exceeded published benchmarks and was associated with approximately doubled ICU mortality, replicated across 209 U.S. hospitals. Hidden hypoxemia was not a pre-terminal artifact: events occurred throughout the ICU stay at a constant rate, and mortality associations persisted in landmark and lactate-stratified analyses. These findings suggest that current regulatory validation standards may underestimate the real-world prevalence of demographic bias in pulse oximetry, and that perfusion-dependent mechanisms may offer a target for algorithmic correction.
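The two quantities defined in the methods (bias = SpO2 - SaO2, and the hidden-hypoxemia flag) are straightforward to compute from paired readings. A minimal NumPy sketch; the function names and toy readings below are ours, not the authors':

```python
import numpy as np

def oximetry_bias(spo2, sao2):
    """Per-reading bias as defined in the abstract: SpO2 - SaO2."""
    return np.asarray(spo2, float) - np.asarray(sao2, float)

def hidden_hypoxemia(spo2, sao2, spo2_cut=94.0, sao2_cut=88.0):
    """Flag readings where the oximeter looks reassuring (SpO2 >= 94%)
    while arterial saturation is severely low (SaO2 < 88%)."""
    return (np.asarray(spo2, float) >= spo2_cut) & (np.asarray(sao2, float) < sao2_cut)

# Hypothetical paired readings from a single ICU stay
spo2 = [96, 95, 97, 94]
sao2 = [94, 86, 96, 87]
print(oximetry_bias(spo2, sao2))           # per-reading bias
print(hidden_hypoxemia(spo2, sao2).any())  # stay-level flag, as assessed per ICU stay
```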
Cistero, B.; Monforte, V.; Camprubi-Rimblas, M.; Areny-Balaguero, A.; Campana-Duel, E.; Fernandez, A.; Casabella Pernas, A.; Nuez Zaragoza, E.; Martin, I.; Tomas, A.; Minarro, I.; Vila, M.; Cuevas, M.; Sanchez, M.; Belda, X.; Lopez Rodriguez, M.; Teles, T.; Savone, M. F.; Stable, C.; Salom Merce, P.; Guijarro Viudez, C.; Tajan, J.; Goma Fernandez, G.; Martinez, M. L.; Kramer, L.; van Amstel, R.; Diaz Santos, E.; Blanch, L.; Gene Tous, E. M.; Bos, L.; Artigas Raventos, A.; Ceccato, A.
Sepsis is a complex condition with a time-dependent evolution. Longitudinal biomarker dynamics could help us to better characterise sepsis. We hypothesised that biomarker kinetics are associated with sepsis and with the intensity of organ dysfunction, and may have predictive capacity for patient survival. This single-centre, prospective, observational study included adult patients presenting to the Emergency Department (ED) with suspected infection. Patients were included if they had a National Early Warning Score 2 (NEWS 2) of 3 or higher. Blood samples were obtained at baseline, 4 h, and 24 h. Linear mixed models were constructed to analyse the association between biomarker concentrations over time, sepsis diagnosis, and organ dysfunction severity. Joint models were used to evaluate the predictive ability of individual biomarker kinetics during the first 24 hours for in-hospital mortality. Of 214 screened patients, 173 were analysed, and 137 (79%) developed sepsis. Linear mixed models revealed time-dependent decreases in IL10 (β -0.016, 95% CI -0.028 to -0.004), IL1RN (β -0.014, 95% CI -0.024 to -0.004), and IL6 (β -0.012, 95% CI -0.024 to 0.00). Sepsis was associated with higher IL1RN (β 0.378, 95% CI 0.153-0.603) and TNFRSF1A (β 0.40, 95% CI 0.21-0.58); only models evaluating IL6 showed a significant interaction between sepsis and time (β -0.14, 95% CI -0.028 to 0.00). SOFA correlated with elevated IL10 (β 0.048, 95% CI 0.021-0.075), IL1RN (β 0.044, 95% CI 0.017-0.071), CCL2 (β 0.046, 95% CI 0.021-0.071), TNFRSF1A (β 0.050, 95% CI 0.030-0.070), and PCT (β 2.63, 95% CI 1.32-3.93); the interaction between SOFA score and time was significant only for IL6 (β -0.003, 95% CI -0.005 to -0.001). Joint survival models (adjusted for age and highest SOFA) identified IL8 (HR 0.655, 95% CrI 0.582-0.728), TNFRSF1A (HR 0.505, 95% CrI 0.419-0.682), and PCT (HR 1.004, 95% CrI 1.001-1.008) as predictors of in-hospital mortality.
Conclusion: Sepsis diagnosis and severity of organ dysfunction may be associated with higher levels and distinct kinetics of inflammatory biomarkers such as IL1RN and TNFRSF1A. IL6 levels showed a significant time interaction with both sepsis diagnosis and SOFA score. TNFRSF1A, IL8, and PCT dynamics were associated with survival and could be useful in developing prognostic tools.
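To illustrate the modelling approach described above (this is not the authors' code), a random-intercept linear mixed model with a fixed effect of time can be fitted with statsmodels. All data below are simulated; the patient count, sampling times, coefficients, and noise levels are invented for the sketch:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
times = np.array([0.0, 4.0, 24.0])  # baseline, 4 h, 24 h sampling scheme

# Simulate log-biomarker trajectories with a true slope of -0.016 per hour
# and a patient-level random intercept.
rows = []
for pid in range(40):
    u = rng.normal(0, 0.5)  # random intercept for this patient
    for t in times:
        rows.append({"patient": pid, "time": t,
                     "log_marker": 2.0 - 0.016 * t + u + rng.normal(0, 0.1)})
df = pd.DataFrame(rows)

# Random-intercept linear mixed model: fixed effect of time on the log
# biomarker concentration, with patient as the grouping factor.
fit = smf.mixedlm("log_marker ~ time", df, groups=df["patient"]).fit()
print(fit.params["time"])  # estimated fixed-effect slope, near the simulated -0.016
```

The same formula interface extends to the interaction terms reported in the abstract (e.g. `log_marker ~ time * sepsis`).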
Liu, D.; Sun, Y.
BACKGROUND: Soluble ST2 (sST2) predicts poor outcomes in heart failure (HF) and sepsis, but whether it drives these conditions or merely reflects them is unknown. We used bidirectional Mendelian randomization (MR) to test six directional causal pathways among sST2, HF, and sepsis. METHODS AND RESULTS: We performed bidirectional two-sample MR analyses drawing on genome-wide association study (GWAS) summary statistics: sST2 data came from deCODE Genetics (n=30,931), HF data from the Heart Failure Molecular Epidemiology for Therapeutic Targets (HERMES) Consortium (n=977,323), and sepsis data from FinnGen R12 (n=500,348). Inverse-variance weighting (IVW) was the main method, complemented by MR-Egger regression, weighted-median, and MR-PRESSO (Mendelian Randomization Pleiotropy RESidual Sum and Outlier) sensitivity tests, and multivariable MR (MVMR). We also performed a cis-SNP analysis using only IL1RL1 variants. None of the six pathways showed a causal effect. Genetically predicted sST2 had no link to sepsis (OR: 1.01; 95% CI: 0.94-1.08; P=0.869) or HF (OR: 0.99; 95% CI: 0.92-1.07; P=0.867). The cis-SNP analysis gave the same answer for sST2→sepsis (OR: 1.03; 95% CI: 0.98-1.09; P=0.223), as did MVMR adjusting for HF (OR: 1.01; 95% CI: 0.93-1.09; P=0.862). HF and sepsis did not cause each other either: HF→sepsis (OR: 0.98; 95% CI: 0.79-1.23; P=0.882), sepsis→HF (OR: 1.07; 95% CI: 0.96-1.20; P=0.236). In the reverse direction, genetic risk for HF or sepsis did not affect sST2 levels. CONCLUSIONS: We found no evidence that sST2 causes HF or sepsis. The picture that emerges is one where sST2 rises because patients are sick, not the other way around. This makes sST2 a useful prognostic signal but probably not a worthwhile drug target. We also found no direct causal link between HF and sepsis.
Clinical Perspective. What Is New? This bidirectional Mendelian randomization study is the first to comprehensively examine potential causal relationships among soluble ST2, heart failure, and sepsis using genetic instruments. We found no evidence that genetically predicted sST2 levels causally influence the risk of heart failure or sepsis, despite strong observational associations reported in clinical studies. Similarly, genetic liability to heart failure or sepsis does not appear to causally affect circulating sST2 levels, suggesting that sST2 elevation is a consequence rather than a cause of these conditions. What Are the Clinical Implications? These findings suggest that sST2 functions primarily as a prognostic biomarker reflecting disease severity rather than as a driver of pathophysiology, which has implications for its use in clinical decision-making. The strong observational associations between sST2 and adverse cardiovascular outcomes likely reflect confounding or reverse causation rather than direct causal effects. Drug development efforts targeting the ST2/IL-33 signaling pathway should consider that modulating sST2 levels may not directly prevent heart failure or improve sepsis outcomes.
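The IVW estimator used as the primary MR method above has a closed form: a regression of per-SNP outcome effects on exposure effects through the origin, weighted by the inverse variance of the outcome effects. A minimal NumPy sketch with made-up summary statistics (the instrument values are invented, not from the cited GWAS):

```python
import numpy as np

def ivw_estimate(beta_exp, beta_out, se_out):
    """Fixed-effect inverse-variance-weighted (IVW) MR estimate: weighted
    regression of outcome effects on exposure effects through the origin,
    with weights 1 / se_out**2."""
    beta_exp = np.asarray(beta_exp, float)
    beta_out = np.asarray(beta_out, float)
    w = 1.0 / np.asarray(se_out, float) ** 2
    beta = np.sum(w * beta_exp * beta_out) / np.sum(w * beta_exp ** 2)
    se = np.sqrt(1.0 / np.sum(w * beta_exp ** 2))
    return beta, se

# Made-up per-SNP summary statistics for two instruments
beta, se = ivw_estimate([0.1, 0.2], [0.05, 0.10], [0.02, 0.02])
print(np.exp(beta))  # causal odds ratio implied by the toy data
```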
Armenta Salas, M.; Zhang, A.; Girard, T. D.; Devlin, J. W.; Barr, J.
BACKGROUND: Delirium is common in critically ill adults but often goes unrecognized and undertreated. Little is known about the perceptions of ICU nurse and physician leaders regarding ICU delirium detection and management and the potential role of objective continuous delirium monitoring in facilitating ICU delirium care. RESEARCH QUESTION: What are the perceptions of ICU leaders regarding the current challenges associated with delirium recognition and management and the potential benefits of continuous delirium monitoring? STUDY DESIGN AND METHODS: We conducted a blinded, cross-sectional, electronic survey of ICU leaders across the U.S., including physician directors and nursing managers with ≥3 years of ICU leadership experience. We asked about perceptions of the effectiveness of current delirium clinical assessment tools, current delirium detection and management challenges, and how an objective, continuous delirium monitoring system might affect clinician practice and patient outcomes in their ICU. RESULTS: Among the 81 respondents (62 physicians, 19 nurses), most (76%) reported that recommended delirium assessment tools (CAM-ICU, ICDSC) are used in their ICUs, though perceptions of how reliably they are conducted were mixed. A majority (63-90%) perceived that current bedside assessments delay and limit the recognition of ICU delirium. Nearly all (89%) agreed that an objective delirium monitoring tool would be more clinically valuable than current delirium assessment tools and would support real-time delirium management by clinicians. CONCLUSIONS: ICU leaders perceive limitations in using clinical delirium assessment tools to effectively detect and manage delirium in ICU patients. Most felt that an objective delirium monitor could facilitate delirium detection and potentially expedite appropriate delirium management.
Tjepkema-Cloostermans, M. C.; Beishuizen, A.; Strang, A. C.; Keijzer, H. M.; Telleman, J. A.; Smook, S. P.; Vermeijden, J. W.; Hofmeijer, J.; van Putten, M. J. A. M.
Objective: Despite substantial variability in the severity of post-anoxic encephalopathy, all comatose patients after cardiac arrest are usually treated according to the same standardized intensive care protocol, including sedation, mechanical ventilation, and targeted temperature management (TTM). We hypothesized that patients with a favourable EEG pattern (continuous EEG within 12 hours after cardiac arrest) may not benefit from prolonged sedation and TTM, and studied the feasibility and safety of early cessation of sedation and TTM in this subgroup. Methods: We conducted a non-randomized, controlled intervention study including 40 adult patients admitted to the ICU with post-anoxic encephalopathy after cardiac arrest and an early (<12 hours) favourable EEG pattern. The control group received standard care with sedation and TTM for at least 24-48 hours, whereas the intervention group underwent early cessation of sedation and TTM as soon as possible after establishing a favourable EEG, followed by weaning from mechanical ventilation. The primary outcome was duration of mechanical ventilation. Secondary outcomes included ICU length of stay, total sedation time, number of ICU complications, and neurological outcomes at 3 and 6 months. Results: Duration of mechanical ventilation was significantly shorter in the intervention group than in the control group (median 12 vs 28 h, p < 0.001). Median ICU length of stay and median total sedation time were also reduced by more than 50% in the intervention group, from 2.5 to 1.2 days (p = 0.001) and from 27 to 12 h (p < 0.001), respectively. There was no increase in ICU complications in the intervention group, and no statistically significant differences in neurological outcomes at 3 or 6 months were observed. Conclusion: Early withdrawal of sedation is feasible and safe in patients with an early favourable EEG following cardiac arrest. The study was underpowered to detect possible differences in long-term neurological recovery.
Significance: Shortening sedation and mechanical ventilation is likely to reduce healthcare costs directly and to contribute to more appropriate care. Larger studies are needed to evaluate the impact on long-term neurological outcomes.